Chapter 1: Sub-Gaussian Random Variables

Abstract

Recall that a Gaussian random variable X has density

p(x) = (1 / (σ√(2π))) exp(−(x − μ)² / (2σ²)),

where μ = E(X) ∈ ℝ and σ² = var(X) > 0 are the mean and variance of X. We write X ∼ N(μ, σ²). Note that X = σZ + μ for Z ∼ N(0, 1) (called standard Gaussian), where the equality holds in distribution. Clearly, this distribution has unbounded support, but it is well known that its support is almost bounded in the following sense: P(|X − μ| ≤ 3σ) ≃ 0.997. This is due to the fast decay of the tails of p as |x| → ∞ (see Figure 1.1). This decay can be quantified using the following proposition (Mills' inequality).
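Both the three-sigma probability and a Mills-type tail bound, P(Z > t) ≤ φ(t)/t for t > 0, can be checked numerically. A minimal sketch using only the Python standard library (the helper names `phi` and `upper_tail` are illustrative, not part of any referenced source):

```python
import math

def phi(t):
    # Standard normal density at t.
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

def upper_tail(t):
    # P(Z > t) for Z ~ N(0, 1), via the complementary error function.
    return 0.5 * math.erfc(t / math.sqrt(2))

# The "three-sigma" rule: P(|X - mu| <= 3*sigma) = P(|Z| <= 3).
p_3sigma = 1 - 2 * upper_tail(3.0)
print(f"P(|X - mu| <= 3 sigma) = {p_3sigma:.4f}")  # ~ 0.9973

# Mills' inequality: for t > 0, P(Z > t) <= phi(t)/t.
for t in (1.0, 2.0, 3.0):
    bound = phi(t) / t
    exact = upper_tail(t)
    assert exact <= bound
    print(f"t = {t}: P(Z > t) = {exact:.5f} <= phi(t)/t = {bound:.5f}")
```

The scaling X = σZ + μ means every statement about Z translates directly to X, which is why only standard-normal tails need to be computed.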


Similar Articles

Complete convergence of moving-average processes under negative dependence sub-Gaussian assumptions

Complete convergence is investigated for moving-average processes of a doubly infinite sequence of negatively dependent sub-Gaussian random variables with zero means, finite variances, and absolutely summable coefficients. As a corollary, the rate of complete convergence is obtained under suitable conditions on the coefficients.


Supremum of Random Dirichlet Polynomials with Sub-multiplicative Coefficients

We study the supremum of random Dirichlet polynomials D_N(t) = Σ_{n=1}^{N} ε_n d(n) n^{−it}, where (ε_n) is a sequence of independent Rademacher random variables and d is a sub-multiplicative function. The approach is Gaussian and based entirely on comparison properties of Gaussian processes, with no use of the metric entropy method.


Supplement to 'Sparse recovery by thresholded non-negative least squares'

We here provide additional proofs, definitions, lemmas, and derivations omitted in the paper. Note that material contained in the latter is referred to by the captions used there (e.g., Theorem 1), whereas auxiliary statements contained exclusively in this supplement are preceded by a capital Roman letter (e.g., Theorem A.1). A: Sub-Gaussian random variables and concentration inequalities. A random…


Hanson-Wright Inequality and Sub-Gaussian Concentration

In this expository note, we give a modern proof of the Hanson-Wright inequality for quadratic forms in sub-Gaussian random variables. We deduce a useful concentration inequality for sub-Gaussian random vectors. Two examples are given to illustrate these results: a concentration of distances between random vectors and subspaces, and a bound on the norms of products of random and deterministic matrices…


Introduction to the non-asymptotic analysis of random matrices

2 Preliminaries — 7
2.1 Matrices and their singular values — 7
2.2 Nets — 8
2.3 Sub-gaussian random variables — 9
2.4 Sub-exponential random variables — 14
2.5 Isotropic random vectors — …



Publication date: 2015